
ks (version 1.4.4)

kda, Hkda, Hkda.diag: Kernel discriminant analysis for multivariate data

Description

Kernel discriminant analysis for 1- to 6-dimensional data.

Usage

kda(x, x.group, Hs, hs, y, prior.prob=NULL)

Hkda(x, x.group, Hstart, bw="plugin", nstage=2, pilot="samse", pre="sphere", binned=FALSE)

Hkda.diag(x, x.group, bw="plugin", nstage=2, pilot="samse", pre="sphere", binned=FALSE)

Arguments

x: matrix (or vector, for 1-d data) of training data values

x.group: vector of group labels for the training data

Hs: (stacked) matrix of bandwidth matrices

hs: vector of scalar bandwidths (1-d data)

y: matrix (or vector) of test data to be classified (optional)

prior.prob: vector of prior probabilities; the default NULL uses the sample proportions

Hstart: (stacked) matrix of initial bandwidth matrices for the optimisation

bw: bandwidth selector: "plugin", "lscv" or "scv"

nstage: number of stages in the plug-in selector (1 or 2)

pilot: pilot bandwidth selector: "amse" or "samse"

pre: pre-transformation: "scale" or "sphere"

binned: flag for binned kernel estimation

Value

-- The result from kda is a vector of group labels estimated via the kernel discriminant rule. If the test data y are given then these are classified; otherwise the training data x are classified.

-- The result from Hkda and Hkda.diag is a stacked matrix of bandwidth matrices, one for each training data group.


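As an illustration of the stacked layout (the specific numbers here are made up, not taken from the package), a stacked matrix for g groups of d-dimensional data can be pictured as g row-bound d x d matrices, so group j's bandwidth matrix occupies rows ((j-1)*d+1):(j*d):

```r
# Hypothetical sketch of a "stacked" bandwidth matrix: three 2x2
# bandwidth matrices row-bound into a 6x2 matrix, one per group.
d <- 2
Hs <- rbind(diag(d) * 0.1,   # group 1
            diag(d) * 0.2,   # group 2
            diag(d) * 0.3)   # group 3

# Extract group 2's bandwidth matrix from the stack:
H2 <- Hs[((2 - 1) * d + 1):(2 * d), ]
```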

Details

-- If prior probabilities are known, set prior.prob to them. Otherwise the default prior.prob=NULL uses the sample proportions as estimates of the prior probabilities.
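The default sample-proportion priors can be computed directly in base R (an illustration of the default behaviour, not a call into ks):

```r
# With prior.prob=NULL, the priors default to the sample proportions
# of x.group; e.g. 60 observations in group 1 and 40 in group 2
# give estimated priors of 0.6 and 0.4.
x.gr  <- rep(c(1, 2), times = c(60, 40))
prior <- as.numeric(prop.table(table(x.gr)))
```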

-- The valid values for bw in Hkda are "plugin", "lscv" and "scv", which call Hpi, Hlscv and Hscv respectively. For plug-in selectors, nstage, pilot and pre all need to be set. For SCV selectors, currently nstage=1 always, but pilot and pre need to be set. For LSCV selectors, none of these are required.

For Hkda.diag, the options are "plugin" or "lscv", which call Hpi.diag and Hlscv.diag respectively. Again, nstage, pilot and pre are available for Hpi.diag but are not required for Hlscv.diag. For details on the pre-transformations in pre, see pre.sphere and pre.scale.
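The mapping from bw values to the underlying selectors for Hkda can be summarised with a small helper (purely illustrative; selector is not a ks function):

```r
# Illustrative lookup (not part of ks): which selector each bw
# value of Hkda dispatches to.
selector <- function(bw) {
  switch(bw,
         plugin = "Hpi",
         lscv   = "Hlscv",
         scv    = "Hscv",
         stop("bw must be 'plugin', 'lscv' or 'scv'"))
}
```

For example, selector("plugin") returns "Hpi".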

References

Mardia, K.V., Kent, J.T. & Bibby, J.M. (1979) Multivariate Analysis. Academic Press, London.

Silverman, B.W. (1986) Density Estimation for Statistics and Data Analysis. Chapman & Hall, London.

Simonoff, J.S. (1996) Smoothing Methods in Statistics. Springer-Verlag, New York.

Venables, W.N. & Ripley, B.D. (1997) Modern Applied Statistics with S-PLUS. Springer-Verlag, New York.

See Also

compare, compare.kda.cv, kda.kde

Examples

### univariate example -- independent test data
x <- c(rnorm.mixt(n=100, mus=1, sigmas=1, props=1),
       rnorm.mixt(n=100, mus=-1, sigmas=1, props=1))
x.gr <- rep(c(1,2), times=c(100,100))
y <- c(rnorm.mixt(n=100, mus=1, sigmas=1, props=1),
       rnorm.mixt(n=100, mus=-1, sigmas=1, props=1))

kda.gr <- kda(x, x.gr, hs=sqrt(c(0.09, 0.09)), y=y)


### bivariate example -- restricted iris dataset
data(iris)    # iris ships with the datasets package; MASS is not needed
ir <- iris[,1:2]
ir.gr <- iris[,5]

H <- Hkda(ir, ir.gr, bw="plugin", pre="scale")
kda.gr <- kda(ir, ir.gr, Hs=H, y=ir)


### multivariate example - full iris dataset
ir <- iris[,1:4]
ir.gr <- iris[,5]

H <- Hkda(ir, ir.gr, bw="plugin", pre="scale")
kda.gr <- kda(ir, ir.gr, Hs=H, y=ir)
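After classification it is natural to cross-tabulate the true and estimated group labels; the package's own helpers for this are compare and compare.kda.cv. A base-R sketch with made-up labels (standing in for ir.gr and kda.gr above):

```r
# Hypothetical labels: 10 observations, one of which is misclassified.
true.gr <- rep(c("a", "b"), times = c(5, 5))
est.gr  <- c(rep("a", 4), "b", rep("b", 5))

ct  <- table(true.gr, est.gr)        # cross-classification table
err <- 1 - sum(diag(ct)) / sum(ct)   # misclassification rate
```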
